All that’s Old is New Again.
March 1, 2021
By: Emil W. Ciurczak
Independent Pharmaceuticals Professional
I have been in the analytical chem “biz” for a pretty long time, and I have watched the trends and re-designs of equipment for over 50 years. When commercial HPLC equipment was introduced circa 1970, it was quite primitive compared with the excellent units offered today. But then, the Model T is less sophisticated than a Taurus; we had to start somewhere. The earliest unit I worked with (by Waters) was an integrated system using air pressure in lieu of a piston pump to move the mobile phase, hence the term “high pressure.” Later, in the 1980s, we moved to individual modules: pumps, injectors, detectors, etc. This made maintenance and parts replacement easier, but the footprint became inconvenient. At Sandoz, we constructed mini-platforms to locate the injectors over the pumps to reduce the footprint, using solenoids to switch solvents, controlled by our archaic LIMS system. Eventually, by the end of the 1980s, we saw the migration back to self-contained units enveloping all the parts needed for analysis.

A similar yo-yo engineering trend occurred in spectroscopy. The early instruments were single-beam units, meaning that each sample measurement had to be run against a concurrent blank using the same size cuvette and solvent. The Absorbance values—if I need to explain what Absorbance is, skip to the next article, you’re on the wrong page—were then based on the ratio of each pair of spectra. Since the early units could drift rather quickly, these “blanks,” or references, needed to be run almost continuously for comparison. Still, compared with titrations, these measurements were so much better that the mechanical manipulations were accepted as a huge improvement. As a quantitative tool, it was great; for molecular specificity, however, mid-range infrared was deemed “the way.” The “great disturbance in the Force” was felt when dual-beam instruments were introduced.
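The single-beam workflow described above (run a blank, run the sample, ratio the two) reduces to the Beer-Lambert relation. A minimal sketch in Python; the detector counts are invented for illustration:

```python
import numpy as np

def absorbance(sample_counts, reference_counts):
    """Absorbance from single-beam scans: A = log10(I_blank / I_sample).

    Both inputs are raw detector intensities taken at the same
    wavelengths, sample and blank measured in matching cuvettes.
    """
    sample = np.asarray(sample_counts, dtype=float)
    blank = np.asarray(reference_counts, dtype=float)
    return np.log10(blank / sample)

# A sample transmitting 10% of the blank's light reads 1.0 AU.
print(absorbance([100.0], [1000.0])[0])  # → 1.0
```

This is also why instrument drift mattered so much: any change in lamp or detector response between the blank scan and the sample scan lands directly in the ratio.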
The light beam was run either sequentially or concomitantly through the reference cell and the sample cell. Yet high-quality spectra still took care and time. For example, because the detector response (and thus the relaxation time) was slow, the reference beam had to be attenuated, often with a comb-shaped device, so that the sample and reference beams registered nearly the same levels at the single detector, avoiding unwanted noise. Even so, spectroscopy was greatly improved with dual-beam instruments. Of course, they were largely lab-based because of their size and sensitivity to vibrations (e.g., being moved). Back in the 1970s, it could take 22-plus minutes to generate a high-quality infrared spectrum, first with our (older) prism units, then with grating-based ones. “Quick-and-dirty” spectra could be produced in merely 4-5 minutes, and that was often “good enough for government work,” as we were wont to say. Fortunately, the age of computers was dawning and better days were ahead, if only by allowing digital spectra to be compared.

It wasn’t until the interferometer came around that fast IR spectra were possible, but, like the Hope Diamond, it came with a curse: it was (again) a single-beam instrument and needed a reference. The spectra were generated by impinging the entire spectrum of light on the sample to increase sensitivity, and the first instruments were the size of Mini Coopers and needed as much care and maintenance as early Model T Fords. The success of fast monochromators was nearly concomitant with the “popping” of the telecom industry bubble, and a number of “silver linings” were found in the telecom bust:

1. The telecom companies had been buying up Indium Gallium Arsenide (InGaAs) detectors, making the price to small instrument companies prohibitive. An InGaAs diode array could cost up to $10,000, pushing the price of a NIR spectrometer beyond what most companies were willing to pay.
These particular detectors were coveted by the telecom industry because their speed and low noise suited them for repeaters on long-distance communications lines, such as undersea cables. When the industry collapsed, their price dropped, in some cases to as low as $10 for a single detector and $100 for an array. Suddenly, faster, quieter units could be produced at a reasonable cost.

2. All along the tech corridor outside of Boston, multiple small suppliers of telecom components were wringing their collective hands (imagine the feeling of steering-wheel suppliers if cars were outlawed). Again, silver linings abounded:

a. The fiber optic suppliers found that their products, geared to the visible and short-wave near-infrared (800-1000 nm; 12,500-10,000 cm-1), could be improved to pass radiation up to 2500 nm (not the end of the NIR, but the limit of the usual power sources, quartz-halogen lamps). This made for a boon in fiber optics for NIR applications.

b. The heartbeat of the voice-signal repeaters was MEMS (Micro-Electro-Mechanical Systems: devices ranging from relatively simple structures with no moving elements to extremely complex electromechanical systems with multiple moving elements under the control of integrated microelectronics, used here to regulate wavelengths). Several small-ish companies made these devices for retransmission of voice and video signals along the fiber optic networks. Their choices, after the bubble burst, were to 1) close or 2) find another customer. They mostly chose the latter, finding that their units were quite good replacements for larger, conventional interferometers. This led to the construction of truly portable spectrometers. All this hardware was being developed in parallel with fast, powerful computing systems.
When fast spectrometers were paired with powerful computers, equipped with modern algorithms and WiFi connections, a revolution in process control was set to happen. Size wasn’t the concern when the first on-board NIR instrument was developed by Pfizer and Zeiss in the late 1980s, but it did coincide with growing interest by the U.S. FDA in on-line monitoring and control. After all, the wireless, self-powered instrument was built to ride on V-blenders or bin blenders; adding a few kilos to a massive steel blender containing potentially hundreds of kilos of powders made little difference to the operation of the blender.

A similar lack of need for miniaturization was seen in early PAT (Process Analytical Technology) applications. It wasn’t important how large the “mother instrument” was, as long as fiber optic probes could reach the point of analysis. Want to qualify incoming raw materials? Place a hardened unit (usually NIR) on a cart, wheel it into the warehouse or loading dock, open each container, and measure. Want to follow the blend uniformity in a hopper? Same solution: fiber probes in the stream. To measure the final product, in almost all cases, individual tablets, vials, or capsules were taken off the line and measured in a stationary, near-line unit. For “standard” PAT work, this was sufficient.

However, as QbD (Quality by Design) became popular (urged on by the FDA), merely measuring parts of the process, to be reviewed at leisure, wasn’t good enough. The QbD paradigm of “measure AND control” demanded both more measurements, taken at higher speed, and the ability of those measurements to control the process in a meaningful time frame. It quickly became apparent that the larger, (relatively) slower, and more expensive instruments weren’t exactly what was needed. The PAT/QbD duo was joined, several years ago, by Continuous Manufacturing (CM), which immediately placed greater demands on the instrumentation.
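The “measure AND control” idea can be reduced to a toy feedback step. A hypothetical sketch, assuming a proportional controller nudging dryer power from an in-line NIR moisture reading; the function name, setpoint, and gain are all invented for illustration, not from any real PAT system:

```python
def heater_adjustment(moisture_pct, setpoint_pct=2.0, gain=5.0):
    """One proportional-control step: return a heater-power correction
    (arbitrary units) from the latest in-line moisture reading.
    Positive output means the product is too wet, so add heat."""
    return gain * (moisture_pct - setpoint_pct)

# Reading 3.0% moisture against a 2.0% target calls for +5.0 units of
# heat; reading 1.5% calls for backing off by 2.5 units.
print(heater_adjustment(3.0), heater_adjustment(1.5))  # → 5.0 -2.5
```

The point is the closed loop: each spectrum feeds an adjustment within seconds, rather than being filed for review after the batch.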
For those of you returning from a space voyage, CM is based upon, strangely enough, a continuous stream of raw materials (including the API) travelling through a series of connected process equipment: weighing (proportioning the ingredients in the proper ratio), blending (often a screw-type), possibly continuous granulation, drying, lubricating, compressing, and coating, all designed to resemble an automated car wash. The powders go in; the tablets come out.

BUT, to run such a line for, potentially, days at a time, you need real-time measurements and immediate feedback and control of each process. That is, each weighing/dispensing point (the number depends on the number of materials in the dosage form) needs a monitor; the blender needs a minimum of two (beginning and end, at least) to assure proper blend uniformity; the granulator and drier need monitoring units to assure proper granulation, drying, and lubrication; and the tablet press needs at least one unit (perhaps with multiple fiber probes) to monitor up to 100% of the tablets made. Of course, coating pans have had Raman or NIR monitors for decades, so that application is a given.

Quite clearly, there is no desire to spend more on the monitoring instruments than on the equipment used to produce the product. This is where the small spectrometers come in. The available units are quite numerous. What needs to be decided is what to measure (the critical process parameters, or CPPs), where to measure it, and how often. We also need to determine the complexity of the measurement. That is, we would not need a complex instrument with sophisticated software if we merely want to measure a simple moisture level or average particle size by diffuse reflection; such a device could be a simple one- or two-wavelength (filter?) device and quite inexpensive. As the needs increase, so will the complexity and (gasp!) the price.
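A “simple one- or two-wavelength device” for moisture amounts to a band-ratio calculation. In this sketch, reflectance at roughly 1940 nm (a strong water combination band) is ratioed against a nearby reference wavelength; the exact wavelengths are an assumption here, and any real gauge would still need calibration against a primary method such as loss on drying:

```python
import math

def moisture_index(r_1940, r_ref):
    """Two-wavelength diffuse-reflectance moisture index.

    r_1940: reflectance (0-1) at the ~1940 nm water band.
    r_ref:  reflectance (0-1) at a nearby, weakly absorbing wavelength.
    Higher index means more apparent water.
    """
    a_water = math.log10(1.0 / r_1940)  # apparent absorbance at water band
    a_ref = math.log10(1.0 / r_ref)     # apparent absorbance at reference
    return a_water - a_ref
```

A wetter sample reflects less at the water band, so the index rises; a filter photometer doing only this needs neither a full spectrum nor sophisticated chemometrics.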
But even an “expensive” mini-device is nowhere near as large or expensive as the traditional models. I believe it would be quite “do-able” to have numerous small units, connected through sophisticated algorithms, monitoring and controlling the production of drug products. In the long run, the costs will be lower (fewer returned lots, rejected lots, etc.). Will there be growing pains? Surely. Will we need different training and modes of validation? Of course. Will it be worth it? You bet! The eventual COGS will drop, in no small part due to the smaller footprint and lower downtime of CM equipment, and the quality of the product will rise.